How much can we learn from nearest neighbor distributions?

Author

  • Christoph Schlier
Abstract

Nearest neighbor distributions of molecular spectra can, in principle, be used to infer from quantum spectra whether the classical dynamics of a system is regular or irregular (chaotic). However, the predictive power of this method is limited by the generally small number of spectral lines available for analysis and by ambiguities in the procedures used. This is demonstrated here for the determination of the shape of nearest neighbor distributions in terms of a Brody parameter, which was determined from fits to samples drawn from a Brody distribution and from fits to simulated molecular spectra. The procedures are also applied to computed spectra of NO2 and SO2.

PACS: 33.20.Tp, 05.45.Mt
E-mail: [email protected]
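The Brody distribution interpolates between Poisson spacing statistics (q = 0, regular dynamics) and Wigner statistics (q = 1, chaotic dynamics). As a rough illustration of the fitting problem the abstract describes (function names are my own, not from the paper), the sketch below draws a small sample of level spacings from a Brody distribution and recovers q by maximum likelihood:

```python
import numpy as np
from scipy.special import gamma as Gamma
from scipy.optimize import minimize_scalar

def brody_pdf(s, q):
    # Brody distribution P(s) = (q+1) a s^q exp(-a s^(q+1)),
    # with a chosen so that the mean spacing equals 1
    a = Gamma((q + 2) / (q + 1)) ** (q + 1)
    return (q + 1) * a * s**q * np.exp(-a * s**(q + 1))

def brody_sample(q, n, rng):
    # inverse-CDF sampling: F(s) = 1 - exp(-a s^(q+1))
    a = Gamma((q + 2) / (q + 1)) ** (q + 1)
    u = rng.random(n)
    return (-np.log(1.0 - u) / a) ** (1.0 / (q + 1))

def fit_brody(spacings):
    # maximum-likelihood estimate of the Brody parameter q in (0, 1]
    nll = lambda q: -np.sum(np.log(brody_pdf(spacings, q)))
    return minimize_scalar(nll, bounds=(1e-6, 1.0), method="bounded").x

rng = np.random.default_rng(0)
spacings = brody_sample(0.5, 200, rng)  # small sample, as in real spectra
q_hat = fit_brody(spacings)
```

With only ~200 spacings, the scatter of q_hat around the true value is substantial, which is the limitation the paper quantifies.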


Related articles

Asymptotic Behaviors of Nearest Neighbor Kernel Density Estimator in Left-truncated Data

Kernel density estimators are the basic tools for density estimation in non-parametric statistics. The k-nearest neighbor kernel estimators represent a special form of kernel density estimator in which the bandwidth varies depending on the location of the sample points. In this paper, we initially introduce the k-nearest neighbor kernel density estimator in the random left-truncatio...
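The core idea above (an adaptive bandwidth set by the distance to the k-th nearest sample point) can be sketched for the plain, untruncated case; this is only an illustration of the k-NN kernel construction, not the left-truncated estimator the paper develops, and the function name is my own:

```python
import numpy as np

def knn_kernel_density(x, sample, k=10):
    # Gaussian kernel estimator whose bandwidth at each query point
    # is the distance to the k-th nearest sample point
    x = np.atleast_1d(x).astype(float)
    d = np.abs(x[:, None] - sample[None, :])   # |x_i - X_j|
    h = np.sort(d, axis=1)[:, k - 1]           # adaptive bandwidth h_k(x)
    u = d / h[:, None]
    return np.mean(np.exp(-0.5 * u**2), axis=1) / (np.sqrt(2 * np.pi) * h)

rng = np.random.default_rng(1)
sample = rng.standard_normal(2000)
f0, f3 = knn_kernel_density([0.0, 3.0], sample, k=50)
```

Near the mode the bandwidth shrinks automatically; in the sparse tails it widens, which is the motivation for varying the bandwidth with location.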


New fingerprint of the entanglement on the thermodynamic properties

The realization that entanglement can affect macroscopic properties of bulk solid-state systems is a challenge in physics and chemistry. Theoretical physicists have often considered the entanglement between nearest-neighbor (NN) spins and tried to characterize it in terms of macroscopic thermodynamic observables such as magnetization and specific heat. Here, we focus on the entanglement b...


Learning a Nonlinear Embedding by Preserving Class Neighbourhood Structure

We show how to pretrain and fine-tune a multilayer neural network to learn a nonlinear transformation from the input space to a low-dimensional feature space in which K-nearest neighbour classification performs well. We also show how the nonlinear transformation can be improved using unlabeled data. Our method achieves a much lower error rate than Support Vector Machines or standard backpropaga...


BoostML: An Adaptive Metric Learning for Nearest Neighbor Classification

The nearest neighbor classification/regression technique is, besides its simplicity, one of the most widely applied and well-studied techniques for pattern recognition in machine learning. A nearest neighbor classifier assumes class conditional probabilities to be locally smooth. This assumption is often invalid in high dimensions, and significant bias can be introduced when using the nearest ne...
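The baseline classifier discussed above is simple enough to state in a few lines; this minimal 1-NN sketch (names are my own, and it omits the adaptive metric the paper proposes) shows the local-smoothness assumption at work, since each prediction is just the label of the closest training point:

```python
import numpy as np

def one_nn_predict(X_train, y_train, X_test):
    # label each test point with the label of its nearest
    # training point under the Euclidean distance
    d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    return y_train[np.argmin(d, axis=1)]

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])
y_train = np.array([0, 0, 1, 1])
pred = one_nn_predict(X_train, y_train,
                      np.array([[0.05, 0.1], [5.1, 5.0]]))  # -> [0, 1]
```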


Large Margin Nearest Neighbor Classification using Curved Mahalanobis Distances

We consider the supervised classification problem of machine learning in Cayley-Klein projective geometries: We show how to learn a curved Mahalanobis metric distance corresponding to either the hyperbolic geometry or the elliptic geometry using the Large Margin Nearest Neighbor (LMNN) framework. We report on our experimental results, and further consider the case of learning a mixed curved Mah...
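For reference, the flat (uncurved) Mahalanobis distance that LMNN-style methods learn is the quadratic form below; the curved Cayley-Klein variants the abstract describes generalize this, and the matrix here is purely illustrative:

```python
import numpy as np

def mahalanobis_sq(x, y, M):
    # squared Mahalanobis distance (x - y)^T M (x - y);
    # M must be positive semidefinite for a valid (pseudo)metric
    d = np.asarray(x, float) - np.asarray(y, float)
    return float(d @ M @ d)

M = np.array([[2.0, 0.0],
              [0.0, 0.5]])   # an illustrative learned metric matrix
d2 = mahalanobis_sq([1.0, 0.0], [0.0, 2.0], M)  # 2*1 + 0.5*4 = 4.0
```

With M equal to the identity this reduces to the ordinary squared Euclidean distance.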




Publication date: 2002